On the Dimension-Free Approximation of Deep Neural Networks for Symmetric Korobov Functions

Yulong Lu, Tong Mao, Jinchao Xu, Yahong Yang

arXiv.org Artificial Intelligence

Deep neural networks have been widely used as universal approximators for functions with inherent physical structures, including permutation symmetry. In this paper, we construct symmetric deep neural networks to approximate symmetric Korobov functions and prove that both the convergence rate and the constant prefactor scale at most polynomially with respect to the ambient dimension. This represents a substantial improvement over prior approximation guarantees that suffer from the curse of dimensionality. Building on these approximation bounds, we further derive a generalization-error rate for learning symmetric Korobov functions whose leading factors likewise avoid the curse of dimensionality.

Keywords: Korobov spaces, deep neural networks, symmetric structure, curse of dimensionality

1. Introduction

In this paper, we study quantitative approximation of symmetric functions using deep neural networks (DNNs).
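To make "symmetric deep neural network" concrete, here is a minimal permutation-invariant sketch in the DeepSets style, written in Python/PyTorch. The sum-pooling form, the layer widths, and the class name SymmetricNet are illustrative assumptions for this listing, not the specific construction analyzed in the paper.

```python
import torch
import torch.nn as nn

class SymmetricNet(nn.Module):
    """Permutation-invariant network: rho(sum_i phi(x_i)).

    Permuting the N set elements leaves the sum-pooled features,
    and hence the output, unchanged.
    """
    def __init__(self, dim_in: int, dim_hidden: int = 64):
        super().__init__()
        # phi is applied independently to each set element
        self.phi = nn.Sequential(
            nn.Linear(dim_in, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, dim_hidden), nn.ReLU(),
        )
        # rho maps the pooled representation to a scalar output
        self.rho = nn.Sequential(
            nn.Linear(dim_hidden, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N, dim_in); pool symmetrically over the set axis
        pooled = self.phi(x).sum(dim=1)
        return self.rho(pooled)

# Sanity check: the output is invariant under permuting the elements.
x = torch.randn(2, 5, 3)
net = SymmetricNet(dim_in=3)
perm = torch.randperm(5)
assert torch.allclose(net(x), net(x[:, perm]), atol=1e-5)
```

The sum pooling is what enforces the symmetry by construction, so invariance holds exactly (up to floating-point reordering) rather than being learned from data.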


Exponential Separations in Symmetric Neural Networks

Neural Information Processing Systems

In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network (Santoro et al., 2017) architecture as a natural generalization of the DeepSets (Zaheer et al., 2017) architecture, and study their representational gap. Under the restriction to analytic activation functions, we construct a symmetric function acting on sets of size N with elements in dimension D that can be efficiently approximated by the former architecture but provably requires width exponential in N and D for the latter.
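For contrast with the DeepSets sketch above, a Relational Network pools over pairs of set elements rather than single elements, which is the source of the representational gap studied here. A minimal sketch under the same assumptions (Python/PyTorch, illustrative widths, hypothetical class name RelationalNet):

```python
import torch
import torch.nn as nn

class RelationalNet(nn.Module):
    """Relational Network: rho(sum_{i,j} phi(x_i, x_j)).

    Pooling over all ordered pairs is still permutation-invariant,
    but phi now sees pairwise interactions between elements.
    """
    def __init__(self, dim_in: int, dim_hidden: int = 64):
        super().__init__()
        # phi acts on a concatenated pair (x_i, x_j)
        self.phi = nn.Sequential(
            nn.Linear(2 * dim_in, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, dim_hidden), nn.ReLU(),
        )
        self.rho = nn.Sequential(
            nn.Linear(dim_hidden, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N, dim_in); build all ordered pairs (x_i, x_j)
        b, n, d = x.shape
        xi = x.unsqueeze(2).expand(b, n, n, d)
        xj = x.unsqueeze(1).expand(b, n, n, d)
        pairs = torch.cat([xi, xj], dim=-1)       # (b, n, n, 2d)
        pooled = self.phi(pairs).sum(dim=(1, 2))  # pool over all pairs
        return self.rho(pooled)

# Invariance sanity check, as for SymmetricNet above.
x = torch.randn(2, 5, 3)
net = RelationalNet(dim_in=3)
perm = torch.randperm(5)
assert torch.allclose(net(x), net(x[:, perm]), atol=1e-4)
```

The separation result says that some symmetric functions computed cheaply by this pairwise form require exponentially wide phi/rho in the single-element (DeepSets) form, at least for analytic activations; the sketch only illustrates the two pooling schemes, not that lower bound.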